Should artificial intelligence (AI) technology be used in warfare? Can robots be authorized to kill people? These questions have been repeatedly debated in scientific and technological circles abroad. A report in the US Wall Street Journal on the 29th put forward a view completely opposite to that of many scientists, once again drawing attention to, and stirring worries about, military artificial intelligence technology and even intelligent warfare.

Killing or kindness?

The use of artificial intelligence for military purposes, and especially on the battlefield, has long been strongly opposed by many foreign scientific and technological experts. Scientists warn that intelligent robots are vulnerable to hackers and could be hijacked to attack innocent people. Last year, 116 experts in artificial intelligence and robotics from around the world issued a joint open letter calling on the United Nations to take action to ban "autonomous killing machines," describing them as "weapons of terror." In April of this year, more than 3,000 Google employees signed a letter to Google CEO Sundar Pichai opposing the company's participation in the Pentagon's military artificial intelligence program, Project Maven. The project uses deep learning, artificial intelligence analysis and computer vision technology to help the US Department of Defense extract and identify key targets from images and videos. Google executives were forced to announce that the Project Maven contract, which expires in 2019, will not be renewed with the Pentagon. Some 2,400 robotics researchers and engineers, including Elon Musk, CEO of US Space Exploration Technologies Corp. (SpaceX), also recently pledged in Stockholm that they "will never participate in the development or manufacture of lethal autonomous weapon systems."

But unlike these voices against intelligent weapons, the US Wall Street Journal offered a completely opposite opinion on the 29th. The report argues that from carpet bombing and high-yield nuclear bombs to precision-guided weapons, humanity has moved ever closer to precision killing: "Intelligent weapons will only reduce accidental injuries and collateral damage in wars, and will not bring greater casualties." The report specifically emphasized that although US drones have accidentally injured civilians on many occasions in recent years, this is not the fault of intelligent weapons but a problem of intelligence and target identification. The Pentagon formed the Algorithmic Warfare Cross-Functional Team and launched Project Maven precisely in order to identify and kill terrorists and other enemies more accurately and to avoid harming innocent civilians. Therefore, the report argues, if Google really wants peace, it should prepare for war.

What can military AI do?

Military applications of artificial intelligence technology, even in warfare, go far beyond killer robots. The most rudimentary AI technology has long been used in many air defense and anti-missile early warning systems: the operator works with the system to identify and confirm enemy targets and to launch interceptor missiles, after which the system autonomously tracks and intercepts the target. In recent years, AI technology built on big data has turned various armed robots and drones into the embryonic form of intelligent weapons, which is precisely what scientists oppose most strongly. The US Army, Navy, Air Force and Marine Corps have all developed a variety of intelligent weapon platforms, and the US
X-47B unmanned aerial vehicle has successfully taken off from and landed on aircraft carriers multiple times. The US Army and Marine Corps have also developed unmanned "wingman" systems for the M1A2 main battle tank and the Apache attack helicopter respectively, seeking to create a new "manned-unmanned teaming" combat model. At the higher level of war command and operational planning, artificial intelligence is also making continual breakthroughs. For example, planning a 20-minute large-scale air battle used to require 40 to 50 people and 12 hours of work; with the assistance of AI, such a plan is expected to be completed within an hour. According to reports, the Russian army is applying AI and big data technology in simulation software to make combat decisions in place of soldiers, in order to respond to sudden, lightning-fast military strikes.

The intelligent arms race has begun

In 2008, the US Office of Naval Research issued a report acknowledging that autonomous robots used in warfare cannot bear legal responsibility: "At present, robots cannot distinguish between civilians and soldiers and cannot be held responsible for their actions. In future unmanned wars, combatants will stay far from the battlefield while unmanned weapons kill in a 'dehumanized' way, neither understanding the ethics of bearing the dangers of war nor having any emotional experience of its suffering." A human doomsday scenario like that of the film "Terminator" is the ultimate nightmare that many scientists worry about. As of now, however, there are no specific international regulations on military intelligence technology, and the international community has not explicitly endorsed any treaty restricting or prohibiting such weapon systems.

It is worth noting that the US military's "Third Offset Strategy" holds that a military revolution marked by intelligent forces, autonomous equipment and unmanned warfare is approaching. The US military plans to initially establish an intelligent combat system by 2035, opening a new military "generation gap" over its major opponents. By 2050, the US military's combat platforms, information systems, and command and control are to be fully intelligent and even unmanned, realizing a true "robot war." The US Department of Defense stated in August last year that future AI wars are inevitable and that the United States needs to "take immediate action" to accelerate the development of AI warfare technology. Russia is also developing multiple intelligent unmanned combat platform projects. The Russian "Uran-6" unmanned mine-clearing vehicle has already been used in the Syrian war, and Sukhoi's "Okhotnik" (Hunter) long-range heavy drone is expected to make its first flight by the end of this year. Historical experience shows that human reason has rarely been able to stop the militarization of new technologies. The arrival of the era of military intelligence may only be a matter of time.